Search results for "data quality"
Showing 10 of 96 documents
Uncertainty assessment of a membrane bioreactor model using the GLUE methodology
2010
A mathematical model for the simulation of physical-biological organic removal by means of a membrane bioreactor (MBR) has been previously developed and tested. This paper presents an analysis of the uncertainty of the MBR model. In particular, the research explores the applicability of the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, which is one of the most widely used methods for investigating uncertainty in hydrology and is now spreading to other research fields. For the application of the GLUE methodology, several Monte Carlo simulations have been run, varying all influential model parameters simultaneously. The model was applied to an MBR pilot pl…
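The GLUE procedure the abstract describes can be sketched in a few steps: sample all parameters simultaneously, score each simulation with an informal likelihood, keep the "behavioural" sets, and derive weighted prediction bounds. A minimal illustration follows, using a toy exponential-decay model in place of the MBR model and a Nash-Sutcliffe likelihood; the model, priors, and threshold are all assumptions for demonstration, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model standing in for the MBR model (assumption for
# illustration only; the real MBR model is far more complex).
def model(a, b, t):
    return a * np.exp(-b * t)

t = np.linspace(0.0, 5.0, 20)
observed = model(2.0, 0.5, t) + rng.normal(0.0, 0.05, t.size)

# Step 1: Monte Carlo sampling -- vary all influential parameters
# simultaneously from (assumed) uniform priors.
n = 5000
a_s = rng.uniform(0.5, 4.0, n)
b_s = rng.uniform(0.1, 1.0, n)
sims = model(a_s[:, None], b_s[:, None], t)          # shape (n, 20)

# Step 2: an informal likelihood measure -- Nash-Sutcliffe efficiency,
# a common choice in GLUE applications.
sse = ((sims - observed) ** 2).sum(axis=1)
nse = 1.0 - sse / ((observed - observed.mean()) ** 2).sum()

# Step 3: retain "behavioural" parameter sets above a subjective threshold
# and normalise their likelihoods into weights.
behavioural = nse > 0.7
weights = nse[behavioural] / nse[behavioural].sum()

# Step 4: likelihood-weighted 5%/95% prediction bounds per time step.
def weighted_quantile(values, w, q):
    order = np.argsort(values)
    cumw = np.cumsum(w[order])
    return np.interp(q, cumw / cumw[-1], values[order])

lower = np.array([weighted_quantile(sims[behavioural, i], weights, 0.05)
                  for i in range(t.size)])
upper = np.array([weighted_quantile(sims[behavioural, i], weights, 0.95)
                  for i in range(t.size)])
```

The spread between `lower` and `upper` is GLUE's uncertainty estimate: it reflects how many quite different parameter sets reproduce the observations acceptably well (equifinality), rather than a single best-fit parameter vector.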
Humanities and Social Sciences. Latvia: Vol. 26, No. 2 (Autumn-Winter 2018)
2018
The Reprocessed Proba-V Collection 2: Product Validation
2021
With the objective of improving data quality in terms of cloud detection, absolute radiometric calibration and atmospheric correction, the PRoject for On-Board Autonomy-Vegetation (PROBA-V) data archive (October 2013 - June 2020) will be reprocessed to Collection 2 (C2). The product validation is organized in three phases and focuses on the intercomparison with PROBA-V Collection 1 (C1), but consistency analyses with SPOT-VGT, Sentinel-3 SYN-VGT, Terra-MODIS and METOP-AVHRR are also foreseen. First preliminary results show the improved performance of cloud and snow/ice masking, and indicate that the statistical consistency between PROBA-V C2 and C1 is in line with expectations. PROBA-V C2 data are …
Electronic system for assessing and analysing digital competences in the context of Knowledge Society
2019
The digital competences of users in general, and of public administration personnel in particular, must allow people access to graphical interfaces and applications, including databases. The use of digital competences is not simply about the emergence and use of computer tools to a certain level, but also about developing new digital skills to face the challenges of data quality, responsibility, confidence in the information received, user privacy and the safety of user data. The study tool in this paper is managed and applied online, and the collection and centralization of data is done automatically, also ensuring information security. The basic method used in this paper is the mode…
The Performance of Belle II High Level Trigger in the First Physics Run
2020
The Belle II experiment is a new-generation B-factory experiment at KEK in Japan, aiming at the search for New Physics in a huge sample of B-meson decays. The commissioning of the accelerator and the detector for the first physics run started in March this year. The Belle II High Level Trigger (HLT) is fully working in the beam run. The HLT is currently operated with 1600 cores clustered in 5 units, which is 1/4 of the full configuration. The software trigger is performed using the same offline reconstruction code, and events are classified into a set of physics categories. Only the events in the categories of interest are finally sent to storage. Live data quality monitoring is also…
University IS Architecture for the Research Evaluation Support
2017
The measuring of research results can be used in different ways, e.g. for the assignment of research grants and afterwards for the evaluation of a project's results. It can also be used for recruiting or promoting research institutions' staff. Because of the wide usage of such measurement, the selection of appropriate measures is important. At the same time, there is no common view on which metrics should be used in this field; moreover, many widely used metrics are often misleading for different reasons: they may be computed from incomplete or faulty data, the metric's computation formula may be invalid, or the computation results may be interpreted wrongly. To produce a good framewo…
An Approach to Data Quality Evaluation
2018
This research proposes a new approach to data quality evaluation comprising 3 aspects: (1) definition of the data object whose quality will be analyzed, (2) specification of quality requirements for the data object using a Domain Specific Language (DSL), (3) implementation of an executable data quality model that enables scanning of the data object and detection of its defects. As in Model Driven Architecture (MDA), the data quality modelling is divided into platform-independent (PIM) and platform-specific (PSM) models. The PIM comprises informal specifications of data quality; the PSM describes the implementation of the data quality model, thus making the data quality model executable. The approbation of the proposed…
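The three aspects above can be pictured with a small executable sketch: a data object (here a list of records), quality requirements expressed as predicates, and an evaluator that scans the object and reports defects. The record fields, rules, and data are illustrative assumptions, not the paper's DSL.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Requirement:
    """One quality requirement on a field of the data object (hypothetical)."""
    field: str
    check: Callable[[Any], bool]   # predicate the field value must satisfy
    message: str

def evaluate(records, requirements):
    """Executable DQ model: scan the data object and collect its defects."""
    defects = []
    for i, rec in enumerate(records):
        for req in requirements:
            if not req.check(rec.get(req.field)):
                defects.append((i, req.field, req.message))
    return defects

# Usage: a toy "person" data object with two assumed requirements.
requirements = [
    Requirement("age", lambda v: isinstance(v, int) and 0 <= v <= 130,
                "age out of range"),
    Requirement("email", lambda v: isinstance(v, str) and "@" in v,
                "malformed email"),
]
records = [
    {"age": 34, "email": "a@example.com"},
    {"age": -5, "email": "not-an-email"},
]
defects = evaluate(records, requirements)
# defects → [(1, 'age', 'age out of range'), (1, 'email', 'malformed email')]
```

In the paper's terms, the `Requirement` list plays the role of the platform-independent specification, while `evaluate` is one possible platform-specific realization that makes the model executable.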
Optimising experimental research in respiratory diseases: an ERS statement
2018
Experimental models are critical for the understanding of lung health and disease and are indispensable for drug development. However, the pathogenetic and clinical relevance of the models is often unclear. Further, the use of animals in biomedical research is controversial from an ethical perspective. The objective of this task force was to issue a statement with research recommendations about lung disease models, by facilitating in-depth discussions between respiratory scientists, and to provide an overview of the literature on the available models. Focus was put on their specific benefits and limitations. This will result in more efficient use of resources and greater reduction in the numb…
ATLAS tile calorimeter data quality assessment with commissioning data
2008
TileCal is the barrel hadronic calorimeter of the ATLAS experiment, presently in an advanced state of installation and commissioning at the LHC accelerator. The complexity of the experiment, the number of electronics channels and the high rate of acquired events require a detailed commissioning of the detector, during the installation phase of the experiment and in the early life of ATLAS, to verify the correct behaviour of the hardware and software systems. This is done through the acquisition, monitoring, reconstruction and validation of calibration signals, as well as by processing data obtained with cosmic-ray muons. To assess the detector status and verify its performance, a set of tools ha…
Data Quality Model-based Testing of Information Systems
2020
This paper proposes a model-based testing approach, offering to use the data quality model (DQ-model) instead of the program's control flow graph as the testing model. The DQ-model contains definitions and conditions that data objects must satisfy to be considered correct. The study proposes to automatically generate a complete test set (CTS) from the DQ-model that allows all data quality conditions to be tested, resulting in full coverage of the DQ-model. In addition, the possibility of checking the conformity of data to be entered, as well as data already stored in the database, is ensured. The proposed alternative approach changes the testing process: (1) the CTS can be generated prior to software developmen…
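The key move the abstract describes -- deriving tests from data quality conditions rather than from the program's control flow -- can be sketched as follows. The fields, conditions, and boundary values are illustrative assumptions, not the paper's actual DQ-model.

```python
# Hypothetical DQ conditions for two fields (assumed for illustration).
checks = {
    "age":   lambda v: isinstance(v, int) and 0 <= v <= 130,
    "email": lambda v: isinstance(v, str) and "@" in v,
}

# Boundary values chosen so every condition is exercised on both sides.
samples = {
    "age":   [0, 130, -1, 131, "old"],
    "email": ["a@b.c", "abc"],
}

# The complete test set: (field, input, expected verdict) triples derived
# purely from the DQ-model -- no control-flow graph of the program needed.
cts = [(f, v, checks[f](v)) for f, vals in samples.items() for v in vals]

# A system under test: here a stub validator that should agree with the model.
def system_validate(field, value):
    return checks[field](value)   # stand-in for the real information system

failures = [(f, v) for f, v, expected in cts
            if system_validate(f, v) != expected]
# failures → []  (the stub conforms to the DQ-model)
```

Because the expected verdicts come from the DQ-model alone, the same `cts` can be generated before the software exists and replayed against each implementation, which is the process change the abstract points to.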